# 22 Elementary row matrices.

Let us briefly go back to elementary row operations. Remember there are three **reversible** kinds: (1) swap, (2) nonzero scaling, and (3) replacement. But what exactly is going on? As it turns out, when we perform $$ A \stackrel{\text{row}}\sim \tilde A, $$ we are in fact multiplying $A$ on the left by some matrix to obtain $\tilde A$; that is, we have some $E$ such that $$ E A = \tilde A. $$ Here, if $A,\tilde A$ are $n\times k$ matrices, then $E$ is a square $n\times n$ matrix. What is this matrix $E$? As it turns out, it is the matrix you get by applying the same elementary row operation to the identity matrix $I_{n\times n}$. We have the following proposition:

> **Proposition.** (Elementary row matrices associated to elementary row operations.) If we perform an **elementary row operation** $\epsilon$ on an $n\times k$ matrix $A$ and obtain $\tilde A$, that is, $$ A \stackrel{\epsilon}\sim \tilde A, $$ then we have $EA = \tilde A$, where $E$ is the $n\times n$ matrix given by applying the **same elementary row operation $\epsilon$** to the identity matrix $I_{n\times n}$.

We call such a matrix $E$, obtained by an elementary row operation on the identity matrix, an **elementary row matrix**. Recall the identity matrix looks like this: $I_{n\times n} = \begin{pmatrix}1\\ & 1\\ & & 1\\ & & & \ddots\\ & & & & 1\end{pmatrix}$.

Let us illustrate the three kinds of elementary row operations with some examples.

**Example.** Suppose $A = \begin{pmatrix}2 & 3 \\4 & 7\end{pmatrix}$, and we perform $R_{1} \leftrightarrow R_{2}$ to obtain $\tilde A = \begin{pmatrix}4 & 7\\2 & 3\end{pmatrix}$. Then $EA = \tilde A$ where $$ E = \begin{pmatrix} 0 & 1\\1 & 0\end{pmatrix}. $$ This $E$ is obtained by applying $R_{1}\leftrightarrow R_{2}$ to the $2\times 2$ identity matrix $I_{2\times 2} = \begin{pmatrix}1 & 0\\0 & 1\end{pmatrix}$. You should verify that indeed $EA = \tilde A$!
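If you'd like to check this with a computer, here is a minimal sketch in plain Python (the `matmul` helper and variable names are ours, purely for illustration) verifying $EA = \tilde A$ for the swap example:

```python
# Minimal matrix multiply for small lists-of-lists (our own helper,
# not anything from the text).
def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

A = [[2, 3], [4, 7]]
E = [[0, 1], [1, 0]]  # the identity with R1 <-> R2 applied

print(matmul(E, A))   # [[4, 7], [2, 3]], i.e. A-tilde with the rows swapped
```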
**Example.** Suppose $A = \begin{pmatrix}2 & 3\\4 & 7\end{pmatrix}$, and we perform $R_{2}\to 5 R_{2}$ to obtain $\tilde A =\begin{pmatrix}2 & 3\\20 & 35\end{pmatrix}$. Then $EA = \tilde A$ where $$ E = \begin{pmatrix}1 & 0\\0 & 5\end{pmatrix}. $$

**Example.** Suppose $A = \begin{pmatrix}2 & 3\\4 & 7\end{pmatrix}$, and we perform $R_{2}\to R_{2} - 2R_{1}$ to obtain $\tilde A = \begin{pmatrix}2 & 3\\0 & 1\end{pmatrix}$. Then $EA = \tilde A$ where $$ E = \begin{pmatrix}1 & 0\\-2 & 1\end{pmatrix}. $$

Note that in each case we are just performing the elementary row operation on the identity matrix to obtain the elementary row matrix. Don't just take my word for it; verify each product above! So, if we have some $n\times k$ matrix $A$, and we perform some elementary row operation $\epsilon$ to obtain $\tilde A$, $$ A\xrightarrow{\epsilon} \tilde A, $$ then we have $EA = \tilde A$, where $E$ is the elementary row matrix obtained by applying the **same** operation $\epsilon$ to the identity matrix $I$, that is, $$ I\xrightarrow{\epsilon} E. $$

And we can chain them together as well. That is,

> If we perform a **sequence of elementary row operations** on a matrix $A$, first by elementary row operation $\epsilon_{1}$, then $\epsilon_{2}$, then $\epsilon_{3}$, and so on, until finally $\epsilon_{p}$, to obtain a matrix $A_{p}$: $$ A \xrightarrow{\epsilon_{1}}A_{1}\xrightarrow{\epsilon_{2}}A_{2}\xrightarrow{\epsilon_{3}}\cdots\xrightarrow{\epsilon_{p}}A_{p}, $$ then we have the product $$ E_{p}\cdots E_{3}E_{2}E_{1}A = A_{p}, $$ where $E_{i}$ is the elementary row matrix associated to the elementary row operation $\epsilon_{i}$.

Since each elementary row operation is reversible, if $E$ is the elementary row matrix associated to some elementary row operation $\epsilon$, let us denote by $E^{-1}$ the elementary row matrix associated with the **reverse** elementary row operation $\epsilon^{-1}$ of $\epsilon$. We say $E^{-1}$ is the **inverse** of $E$.
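Since each elementary matrix is just the identity with the operation applied, we can generate them mechanically. Here is a minimal sketch in plain Python (the `identity` and `matmul` helpers and all variable names are ours, purely for illustration), checking the scaling and replacement examples above:

```python
# Sketch: build each elementary matrix by applying the row operation
# to the identity, then check that E A gives A-tilde.
def identity(n):
    return [[1 if i == j else 0 for j in range(n)] for i in range(n)]

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

A = [[2, 3], [4, 7]]

E_scale = identity(2)
E_scale[1] = [5 * x for x in E_scale[1]]  # R2 -> 5 R2 applied to I
print(matmul(E_scale, A))                 # [[2, 3], [20, 35]]

E_repl = identity(2)
E_repl[1] = [b - 2 * a for a, b in zip(E_repl[0], E_repl[1])]  # R2 -> R2 - 2 R1
print(matmul(E_repl, A))                  # [[2, 3], [0, 1]]
```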
So we have

> $$ A\xrightarrow{\epsilon}\tilde A\quad \text{with } EA=\tilde A $$ if and only if $$ \tilde A\xrightarrow{\epsilon^{-1}} A\quad\text{with } A=E^{-1}\tilde A. $$

## Matrix factorization.

This is neat, but why do we care? It provides a way to factorize a matrix. In particular, we can write down how a matrix $A$ is related to its **reduced row echelon form**.

**Example.** Suppose we have the following sequence of elementary row operations taking the matrix $A = \begin{pmatrix}2 & 3\\4 & 7\end{pmatrix}$ to a **reduced row echelon form**: $$ \begin{array}{rcl} & A = \\ & \begin{pmatrix}2 & 3 \\4 & 7\end{pmatrix} \\ \color{#ff0000}\epsilon_{1}:{R_{2}\to R_{2}-2R_{1}} \downarrow & & \color{#0000ff}\uparrow \epsilon_{1}^{-1}: {R_{2}\to R_{2}+2R_{1}} \\ & \begin{pmatrix}2 & 3\\0 & 1\end{pmatrix} \\ \color{#ff4444}\epsilon_{2}:{R_{1}\to R_{1} -3R_{2}} \downarrow & &\color{#4444ff} \uparrow\epsilon_{2}^{-1}: {R_{1}\to R_{1} +3R_{2}}\\ & \begin{pmatrix}2 & 0\\0 & 1\end{pmatrix} \\ \color{#ff8888}\epsilon_{3}:{R_{1}\to \frac{1}{2} R_{1}} \downarrow & & \color{#8888ff}\uparrow \epsilon_{3}^{-1}: {R_{1}\to 2 R_{1}} \\ & \begin{pmatrix}1 & 0\\0 & 1\end{pmatrix}\\ \end{array} $$ This shows that $$ {\color{#f88}E_{3}}{\color{#f44}E_{2}}{\color{#f00}E_{1}}A = \begin{pmatrix}1 & 0\\0 & 1\end{pmatrix}, $$ where $E_{1}$ is the elementary row matrix associated with $R_{2}\to R_{2}-2R_{1}$, applied to $A$, so $$ \color{#ff0000}E_{1}=\begin{pmatrix}1 & 0\\-2 & 1\end{pmatrix}. $$ Similarly, $E_{2}$ is associated with $R_{1}\to R_{1}-3R_{2}$, applied to the running product, and is given by $$ \color{#ff4444}E_{2} = \begin{pmatrix}1 & -3\\0 & 1\end{pmatrix}. $$ And lastly, $E_{3}$ is associated with $R_{1}\to \frac{1}{2}R_{1}$, given by $$ \color{#ff8888}E_{3} = \begin{pmatrix} \frac{1}{2} & 0 \\0 & 1\end{pmatrix}. $$ So, we have $$ \underbrace{\color{#ff8888}\begin{pmatrix} \frac{1}{2} & 0 \\0 & 1\end{pmatrix}}_{E_{3}}\ \underbrace{\color{#ff4444}\begin{pmatrix}1 & -3\\0 & 1\end{pmatrix}}_{E_{2}}\ \underbrace{\color{#ff0000}\begin{pmatrix}1 & 0\\-2 & 1\end{pmatrix}}_{E_{1}}\ \underbrace{\begin{pmatrix}2 & 3\\4 & 7\end{pmatrix}}_{A} = \begin{pmatrix}1 & 0\\0 & 1\end{pmatrix}. $$

**Remark.** You should multiply the above out to see that this equality indeed holds!

**Remark.** Pay attention to the order in which these matrices are multiplied!

**Remark.** Here we have expressed something multiplied into $A$ to get its reduced row echelon form, but what we often want is to **factorize** $A$ itself as a product. Can you express it the other way, so that $A$ equals some matrices multiplied by its reduced row echelon form? Looking at the diagram above, we can work backwards: reverse all the arrows and reverse each elementary row operation, which gives $$ A = \begin{pmatrix}2 & 3\\4 & 7\end{pmatrix} = \underbrace{\color{#00f}\begin{pmatrix}1 & 0\\2 & 1\end{pmatrix} }_{E_{1}^{-1}}\ \underbrace{\color{#44f}\begin{pmatrix}1 & 3\\0 & 1\end{pmatrix}}_{E_{2}^{-1}}\ \underbrace{\color{#88f}\begin{pmatrix}2 & 0\\0 & 1\end{pmatrix}}_{E_{3}^{-1}}\ \begin{pmatrix}1 & 0\\0 & 1\end{pmatrix}. $$ We didn't need to do any extra work here once we had the row reduction above. And again, pay attention to the order of these matrices: when you reverse to go the other way, you reverse what was done last first, a **socks and shoes principle**!

**Bottom line.** Using these elementary row matrices, we can **factorize** a matrix $A$ as a product of elementary matrices with its **reduced row echelon form**. This is a powerful idea that we will use later.

**Example.** Factorize the matrix $A = \begin{pmatrix}3 & 2 & 1 \\ 4 & 3 & 2\end{pmatrix}$ as a product of elementary matrices and a matrix in reduced row echelon form (RREF) that is row equivalent to $A$.
We first perform elementary row operations on $A$ to obtain its RREF, recording each step as we go: $$ \begin{array}{rcl} \\ & A= & \\ &\begin{pmatrix}3 & 2 & 1\\4 & 3 & 2\end{pmatrix}& \\ \color{#f00} R_{1} \to \frac{1}{3} R_{1} \downarrow & &\color{#00f}\uparrow R_{1} \to 3 R_{1}\\ &\begin{pmatrix}1 & \frac{2}{3} & \frac{1}{3}\\4 & 3 & 2\end{pmatrix} \\ \color{#f44} {R_{2}\to R_{2} - 4R_{1}} \downarrow & & \color{#44f}\uparrow {R_{2}\to R_{2} + 4R_{1}}\\ & \begin{pmatrix}1 & \frac{2}{3} & \frac{1}{3} \\ 0 & \frac{1}{3} & \frac{2}{3}\end{pmatrix} \\ \color{#f88} R_{1}\to R_{1}-2R_{2} \downarrow & & \color{#88f}\uparrow {R_{1}\to R_{1}+2R_{2}}\\ & \begin{pmatrix}1 & 0 & -1 \\0 & \frac{1}{3} & \frac{2}{3}\end{pmatrix} \\ \color{#fcc} R_{2}\to 3R_{2} \downarrow & &\color{#ccf}\uparrow {R_{2}\to \frac{1}{3}R_{2}} \\ & \begin{pmatrix}1 & 0 & -1 \\ 0 & 1 & 2\end{pmatrix} \\ & = \text{RREF}(A) \end{array} $$ (Note there are **many different ways** to obtain the RREF; the intermediate steps may differ depending on the order you choose, but the RREF itself, as you recall, is unique for a given matrix.) As it stands, the diagram gives a product of elementary matrices times $A$ equal to $\text{RREF}(A)$. To factorize $A$ instead, we reverse each step, shown in $\color{blue}\text{blue}$ above. So we have the factorization of $A$ as a product of elementary matrices and its RREF as follows: $$ A = {\color{#00f} \begin{pmatrix}3 & 0\\0 & 1\end{pmatrix}}{\color{#44f} \begin{pmatrix}1 & 0\\4 & 1\end{pmatrix}}{\color{#88f}\begin{pmatrix}1 & 2 \\0 & 1\end{pmatrix}}{\color{#ccf}\begin{pmatrix}1 & 0\\0 & \frac{1}{3}\end{pmatrix} }\begin{pmatrix}1 & 0 & -1\\0 & 1 & 2\end{pmatrix}. \quad\blacklozenge $$
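As a sanity check, we can multiply this factorization back out. A sketch in plain Python (the `matmul` helper and variable names are ours; `fractions.Fraction` keeps the $\frac{1}{3}$ exact instead of a rounding float):

```python
from fractions import Fraction as F

# Minimal matrix multiply for lists-of-lists (our own helper).
def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

# Inverse elementary matrices, read off the blue (reversed) operations:
E1_inv = [[3, 0], [0, 1]]        # reverse of R1 -> (1/3) R1
E2_inv = [[1, 0], [4, 1]]        # reverse of R2 -> R2 - 4 R1
E3_inv = [[1, 2], [0, 1]]        # reverse of R1 -> R1 - 2 R2
E4_inv = [[1, 0], [0, F(1, 3)]]  # reverse of R2 -> 3 R2
RREF = [[1, 0, -1], [0, 1, 2]]

# Multiply in the socks-and-shoes order: last operation reversed first.
A = matmul(E1_inv, matmul(E2_inv, matmul(E3_inv, matmul(E4_inv, RREF))))
print(A == [[3, 2, 1], [4, 3, 2]])  # True: the product recovers A
```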